87 research outputs found

    Bayesian information-theoretic calibration of patient-specific radiotherapy sensitivity parameters for informing effective scanning protocols in cancer

    With new advancements in technology, it is now possible to collect data for a variety of different metrics describing tumor growth, including tumor volume, composition, and vascularity, among others. For any proposed model of tumor growth and treatment, we observe large variability among individual patients' parameter values, particularly those relating to treatment response; exploiting these various metrics for model calibration can therefore help to infer such patient-specific parameters both accurately and early, so that treatment protocols can be adjusted mid-course for maximum efficacy. However, taking measurements can be costly and invasive, limiting clinicians to a sparse collection schedule. As such, determining the optimal times and metrics at which to collect data in order to best inform treatment protocols could be of great assistance to clinicians. In this investigation, we employ a Bayesian information-theoretic calibration protocol for experimental design in order to identify the optimal times at which to collect data for informing treatment parameters. Within this procedure, data collection times are chosen sequentially to maximize the reduction in parameter uncertainty with each added measurement, ensuring that a budget of n high-fidelity experimental measurements yields maximum information gain about the low-fidelity model parameter values. In addition to investigating the optimal temporal pattern for data collection, we develop a framework for deciding which metrics should be utilized at each data collection point. We illustrate this framework with a variety of toy examples, each utilizing a radiotherapy treatment regimen. For each scenario, we analyze how the predictive power of the low-fidelity model depends on the measurement budget.
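
    To make the sequential design loop described above concrete, the following is a minimal numpy sketch of greedy information-gain selection on a discretized parameter grid. The one-parameter exponential-decay model, the noise level, and the candidate scan times are placeholder assumptions for illustration, not the authors' tumor model or measurement protocol.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative low-fidelity model: exponential tumor-volume decay under
        # treatment, with a single unknown sensitivity parameter theta.
        def model(theta, t):
            return np.exp(-theta * t)

        def entropy(p):
            p = p[p > 0]
            return -(p * np.log(p)).sum()

        thetas = np.linspace(0.01, 1.0, 200)            # discretized parameter grid
        post = np.full_like(thetas, 1.0 / len(thetas))  # flat prior
        times = np.linspace(0.5, 10.0, 20)              # candidate measurement times
        sigma = 0.05                                    # assumed measurement noise
        theta_true = 0.3                                # stands in for the patient

        for k in range(5):  # measurement budget n = 5
            # Expected information gain for each candidate time, estimated by
            # Monte Carlo over parameter draws from the current posterior.
            eig = np.zeros(len(times))
            for i, t in enumerate(times):
                for _ in range(100):
                    th = rng.choice(thetas, p=post)
                    y = model(th, t) + sigma * rng.standard_normal()
                    like = np.exp(-0.5 * ((y - model(thetas, t)) / sigma) ** 2)
                    p_new = post * like
                    p_new /= p_new.sum()
                    eig[i] += (entropy(post) - entropy(p_new)) / 100
            t_star = times[np.argmax(eig)]

            # "Collect" the high-fidelity measurement and update the posterior.
            y_obs = model(theta_true, t_star) + sigma * rng.standard_normal()
            like = np.exp(-0.5 * ((y_obs - model(thetas, t_star)) / sigma) ** 2)
            post *= like
            post /= post.sum()
            print(f"measurement {k + 1}: t = {t_star:.2f}, "
                  f"posterior mean theta = {(thetas * post).sum():.3f}")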

    Effective Dose Fractionation Schemes of Radiotherapy for Prostate Cancer

    Radiation therapy remains one of the main cancer treatment modalities. Typical regimens for radiotherapy comprise a constant dose administered on weekdays and no radiation on weekends. In this paper, we examine adaptive dosing strategies for heterogeneous tumors using a dynamical system model that consists of radiation-resistant and parental populations with unique interactive properties, namely the PC3 and DU145 prostate cancer cell lines. We show that stronger doses of radiation given at longer time intervals, while keeping the overall dosage the same, are effective against PC3 cell lines but not against DU145 cell lines. In addition, we tested an adaptive dosing schedule that administers a stronger dose on Friday to compensate for the treatment-off period during the weekend, which was effective in decreasing the final tumor volume for both cell lines. This result creates interesting possibilities for new radiotherapy strategies at clinics that cannot provide treatment on weekends.
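
    A minimal sketch of the kind of experiment described above: a two-population ODE integrated between fractions, with each dose applied instantaneously through the standard linear-quadratic survival model. The growth law, radiosensitivity values, and schedules are placeholder assumptions, not the fitted PC3/DU145 parameters.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical model: parental (P) and radiation-resistant (R) cells
        # sharing a carrying capacity K.
        def growth(t, y, rP=0.04, rR=0.02, K=1.0):
            P, R = y
            crowding = 1 - (P + R) / K
            return [rP * P * crowding, rR * R * crowding]

        def surviving_fraction(dose, alpha, beta):
            """Linear-quadratic model for the fraction surviving one dose."""
            return np.exp(-alpha * dose - beta * dose**2)

        def simulate(schedule, y0=(0.5, 0.05), days=28):
            """schedule maps weekday index (0 = Mon .. 6 = Sun) to dose in Gy."""
            y = np.array(y0)
            for day in range(days):
                y = solve_ivp(growth, (0, 1), y).y[:, -1]  # one day of growth
                dose = schedule.get(day % 7, 0.0)
                if dose > 0:
                    y[0] *= surviving_fraction(dose, alpha=0.15, beta=0.05)
                    y[1] *= surviving_fraction(dose, alpha=0.05, beta=0.02)
            return y.sum()  # final total tumor volume

        standard = {d: 2.0 for d in range(5)}  # constant 2 Gy, Mon-Fri
        friday_boost = {0: 2.0, 1: 2.0, 2: 2.0, 3: 2.0, 4: 4.0}  # stronger Friday
        print("standard    :", simulate(standard))
        print("Friday boost:", simulate(friday_boost))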

    Designing experimental conditions to use the Lotka-Volterra model to infer tumor cell line interaction types

    The Lotka-Volterra model is widely used to model interactions between two species. Here, we generate synthetic data mimicking competitive, mutualistic, and antagonistic interactions between two tumor cell lines, and then use the Lotka-Volterra model to infer the interaction type. Structural identifiability of the Lotka-Volterra model is confirmed, and practical identifiability is assessed for three experimental designs: (a) use of a single data set, with a mixture of both cell lines observed over time; (b) a sequential design, where growth rates and carrying capacities are estimated using data from experiments in which each cell line is grown in isolation, and interaction parameters are then estimated from an experiment involving a mixture of both cell lines; and (c) a parallel design, where all model parameters are fitted to data from two mixtures simultaneously. In addition to assessing each design for practical identifiability, we investigate how the predictive power of the model (i.e., its ability to fit data for initial ratios other than those to which it was calibrated) is affected by the choice of experimental design. The parallel calibration procedure is found to be optimal and is further tested on in silico data generated from a spatially resolved cellular automaton (CA) model, which accounts for oxygen consumption and allows the intensity of the interaction between the two cell lines to vary. We use this study to highlight the care that must be taken when interpreting parameter estimates for the spatially averaged Lotka-Volterra (LV) model when it is calibrated against data produced by the spatially resolved CA model, since baseline competition for space and resources in the CA model may contribute to a discrepancy between the type of interaction used to generate the CA data and the type of interaction inferred by the LV model.
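
    For reference, one common parameterization of the two-species Lotka-Volterra system, together with a sketch of the parallel design (c): all six parameters fitted to two mixtures simultaneously with scipy. The parameter values, noise level, and bounds are illustrative assumptions.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import least_squares

        # dN1/dt = r1*N1*(1 - (N1 + a12*N2)/K1), and symmetrically for N2.
        # The signs of a12 and a21 encode the interaction type: (+,+) competition,
        # (-,-) mutualism, mixed signs antagonism.
        def lv(t, y, r1, r2, K1, K2, a12, a21):
            N1, N2 = y
            return [r1 * N1 * (1 - (N1 + a12 * N2) / K1),
                    r2 * N2 * (1 - (N2 + a21 * N1) / K2)]

        def solve(params, y0, ts):
            return solve_ivp(lv, (ts[0], ts[-1]), y0, t_eval=ts,
                             args=tuple(params)).y

        # Synthetic "mixture" data from known parameters (placeholder values).
        true = [0.5, 0.4, 1.0, 1.0, 0.6, 0.3]
        ts = np.linspace(0, 30, 16)
        rng = np.random.default_rng(1)
        mixtures = [(0.1, 0.9), (0.9, 0.1)]  # two initial ratios, fitted jointly
        data = [solve(true, y0, ts) + 0.01 * rng.standard_normal((2, len(ts)))
                for y0 in mixtures]

        # Parallel design: residuals stacked over both mixtures at once. Mild
        # bounds keep the ODE well behaved during optimization.
        def residuals(p):
            return np.concatenate([(solve(p, y0, ts) - d).ravel()
                                   for y0, d in zip(mixtures, data)])

        fit = least_squares(residuals, x0=[0.3, 0.3, 1.2, 1.2, 0.0, 0.0],
                            bounds=([0.01, 0.01, 0.5, 0.5, -0.5, -0.5],
                                    [2.0, 2.0, 2.0, 2.0, 2.0, 2.0]))
        print("inferred a12, a21:", fit.x[4], fit.x[5])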

    Genomics Data Analysis via Spectral Shape and Topology

    Mapper, a topological algorithm, is frequently used as an exploratory tool to build a graphical representation of data. This representation can help to gain a better understanding of the intrinsic shape of high-dimensional genomic data and to retain information that may be lost using standard dimension-reduction algorithms. We propose a novel workflow that integrates Mapper and differential gene expression to process and analyze RNA-seq data from tumor and healthy subjects. Specifically, we show that a Gaussian mixture approximation method can be used to produce graphical structures that successfully separate tumor and healthy subjects and that split the tumor subjects into two subgroups. A further analysis using DESeq2, a popular tool for the detection of differentially expressed genes, shows that these two subgroups of tumor cells bear two distinct gene regulations, suggesting two discrete paths for forming lung cancer that could not be highlighted by other popular clustering methods, including t-SNE. Although Mapper shows promise for analyzing high-dimensional data, the existing literature offers few tools for statistically analyzing Mapper graphical structures. In this paper, we develop a scoring method using heat kernel signatures that provides an empirical setting for statistical inferences such as hypothesis testing, sensitivity analysis, and correlation analysis.
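
    The key ingredient of the scoring method is the heat kernel signature, which is defined for any graph through the eigendecomposition of its Laplacian. A minimal numpy sketch on a toy graph follows; the paper applies this to Mapper output, so the graph and the diffusion scales here are assumptions made for illustration.

        import numpy as np

        def heat_kernel_signature(adjacency, scales):
            """HKS(v, t) = sum_i exp(-lambda_i * t) * phi_i(v)**2, where
            (lambda_i, phi_i) are eigenpairs of the graph Laplacian L = D - A."""
            degrees = adjacency.sum(axis=1)
            laplacian = np.diag(degrees) - adjacency
            eigvals, eigvecs = np.linalg.eigh(laplacian)
            # Rows index vertices, columns index diffusion scales.
            return (eigvecs**2) @ np.exp(-np.outer(eigvals, scales))

        # Toy Mapper-like graph: a 4-cycle with one chord.
        A = np.zeros((4, 4))
        for i, j in [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]:
            A[i, j] = A[j, i] = 1.0

        hks = heat_kernel_signature(A, scales=np.logspace(-1, 1, 5))
        print(hks.shape)  # (4 vertices, 5 scales)
        # Summary statistics of HKS values compared across graphs yield the
        # scores used for hypothesis testing and sensitivity analysis.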

    Cellular Models of Aggregation-Dependent Template-Directed Proteolysis to Characterize Tau Aggregation Inhibitors for Treatment of Alzheimer's Disease

    Copyright © 2015, The American Society for Biochemistry and Molecular Biology. Acknowledgements: We thank Drs Timo Rager and Rolf Hilfiker (Solvias, Switzerland) for polymorph analyses.

    An adaptive information-theoretic experimental design procedure for high-to-low fidelity calibration of prostate cancer models

    The use of mathematical models to make predictions about tumor growth and response to treatment has become increasingly prevalent in the clinical setting. The level of complexity within these models ranges broadly, and the calibration of the more complex models requires detailed clinical data. This raises questions about what type and quantity of data should be collected, and when, in order to maximize the information gained about model behavior while minimizing the total amount of data used and the time until the model can be calibrated accurately. To address these questions, we propose a Bayesian information-theoretic procedure that uses an adaptive score function to determine the optimal data collection times and measurement types. The novel score function introduced in this work eliminates the need for the penalization parameter used in a previous study, while yielding model predictions that are superior to those obtained using two pre-determined data collection protocols, for two different prostate cancer model scenarios: one in which a simple ODE system is fit to synthetic data generated from a cellular automaton model with radiotherapy as the imposed treatment, and a second in which a more complex ODE system is fit to clinical data for patients undergoing intermittent androgen suppression therapy. We also conduct a robust analysis of the calibration results, using error and uncertainty metrics in combination to determine when additional data acquisition may be terminated.
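
    As a sketch of the termination criterion suggested by that analysis, one can combine an error metric with an uncertainty metric and stop acquiring data once both fall below tolerances. The specific metrics and thresholds below are assumptions for illustration, not the paper's exact criteria.

        import numpy as np

        def should_stop(predictions, observations, posterior_samples,
                        err_tol=0.05, unc_tol=0.05):
            """Stop when the calibrated model both fits well and is certain."""
            # Error metric: relative L2 misfit between predictions and data.
            rel_error = (np.linalg.norm(predictions - observations)
                         / np.linalg.norm(observations))
            # Uncertainty metric: posterior coefficient of variation,
            # averaged over the model parameters.
            cv = np.mean(np.std(posterior_samples, axis=0)
                         / np.abs(np.mean(posterior_samples, axis=0)))
            return rel_error < err_tol and cv < unc_tol

        # Example: a tight posterior and a good fit -> acquisition can stop.
        rng = np.random.default_rng(2)
        obs = np.linspace(1.0, 0.4, 10)
        pred = obs + 0.01 * rng.standard_normal(10)
        samples = rng.normal([0.3, 1.2], [0.005, 0.02], size=(500, 2))
        print(should_stop(pred, obs, samples))  # True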

    Analyzing Collective Motion with Machine Learning and Topology

    We use topological data analysis and machine learning to study a seminal model of collective motion in biology [D'Orsogna et al., Phys. Rev. Lett. 96 (2006)]. This model describes agents interacting nonlinearly via attractive-repulsive social forces and gives rise to collective behaviors such as flocking and milling. To classify the emergent collective motion in a large library of numerical simulations, and to recover model parameters from the simulation data, we apply machine learning techniques to two different types of input. First, we input time series of order parameters traditionally used in studies of collective motion. Second, we input measures based on topology that summarize the time-varying persistent homology of simulation data over multiple scales. This topological approach does not require prior knowledge of the expected patterns. For both unsupervised and supervised machine learning methods, the topological approach outperforms the one based on traditional order parameters. (Published in Chaos 29, 123125 (2019); DOI: 10.1063/1.512549)
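
    For orientation, a minimal simulation of a D'Orsogna-type model with one traditional order parameter (polarization) computed along the trajectory. The parameter values are placeholders rather than those spanned by the paper's simulation library, and the topological pipeline (persistent homology of agent positions) is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(3)
        N, dt, steps = 50, 0.05, 400
        alpha, beta = 1.5, 0.5               # self-propulsion and friction
        Ca, la, Cr, lr = 0.5, 2.0, 1.0, 0.5  # Morse attraction / repulsion

        x = rng.uniform(-1, 1, (N, 2))       # positions
        v = rng.normal(0, 0.1, (N, 2))       # velocities

        def social_force(x):
            """Pairwise Morse-type attraction-repulsion forces."""
            d = x[:, None, :] - x[None, :, :]           # displacements x_i - x_j
            r = np.linalg.norm(d, axis=-1) + np.eye(N)  # avoid divide-by-zero
            mag = Cr / lr * np.exp(-r / lr) - Ca / la * np.exp(-r / la)
            np.fill_diagonal(mag, 0.0)
            return (mag[..., None] * d / r[..., None]).sum(axis=1)

        polarization = []
        for _ in range(steps):
            speed2 = (v**2).sum(axis=1, keepdims=True)
            v += dt * ((alpha - beta * speed2) * v + social_force(x))
            x += dt * v
            # Polarization: norm of the mean heading; near 1 for flocking,
            # near 0 for milling.
            headings = v / np.linalg.norm(v, axis=1, keepdims=True)
            polarization.append(np.linalg.norm(headings.mean(axis=0)))

        print(f"final polarization: {polarization[-1]:.2f}")
        # Time series like this one (or persistence-based summaries of the
        # positions) serve as feature vectors for the classifiers.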

    The role of knowledge management strategies and task knowledge in stimulating service innovation

    Are service firms that enact strategies to manage their new service development (NSD) knowledge able to generate a sustainable competitive advantage (SCA)? Based on analysis of data from a large survey of service companies, the answer is yes. We find that companies employing the knowledge management strategies of codification and personalization exhibit higher levels of NSD knowledge. However, the two strategies differ in their individual performance outcomes, with codification promoting NSD proficiency (an ability to execute NSD activities) and personalization promoting greater NSD innovativeness (market perception of the company as novel and as an innovator). When used together, the two strategies magnify NSD knowledge, which, when combined with NSD proficiency and NSD innovativeness, promotes an SCA. Therefore, companies planning to invest in a knowledge management system should heed the outcomes desired from their NSD process. A system based on documentation exemplifies a codification strategy and will drive NSD proficiency; a system emphasizing interpersonal communication exemplifies a personalization strategy and will drive NSD innovativeness. A system that blends the two strategies appears the most advantageous for service companies whose NSD efforts aim to build a long-term sustainable competitive advantage.

    Effects of oxidized and reduced forms of methylthioninium in two transgenic mouse tauopathy models

    Acknowledgements: The authors acknowledge the contributions of Bettina Seelhorst (histological analysis), Anna Thoma (animal care), Marlene Arthur (animal dosing) and Pierre-Henri Moreau (experimental discussions). This work was supported by TauRx Therapeutics Ltd., Singapore.